Proof Mining in Analysis: Computability and Complexity
Author
Abstract
Proof Mining consists in extracting constructive information from prima facie non-constructive proofs. For that purpose, many techniques have been developed, mostly based on negative translation, realizability, functional interpretation, the A-translation, and combinations thereof. After introducing various approaches to Proof Mining, we present an application of ‘monotone functional interpretation’ to a modification of a classical proof from 1921 of the uniqueness of best L1-approximation of continuous functions by polynomials of degree ≤ n. From that proof we extracted the first effective uniform rate of strong uniqueness (also called a ‘uniform modulus of uniqueness’) for L1-approximation, whose existence had been studied for 80 years, but always in a non-constructive manner. For that reason, previous results only stated the dependencies of the modulus; explicit constants had never been obtained. We also show how this modulus of uniqueness can be used to compute the actual best L1-approximation of a given continuous function on a compact interval. Finally, we analyze the complexity of the resulting algorithm using the tools of Computable Analysis (cf. [Ko91] and [Wei00]). The first part of this work (the extraction of the uniform modulus of uniqueness) has been completed; the last two parts are under development. This report also includes indications of future areas of work.
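For readers unfamiliar with the central notion, the following is a schematic formulation (our own gloss, not taken verbatim from the report) of a uniform modulus of uniqueness Φ for best L1-approximation, assuming P_n denotes the polynomials of degree ≤ n, \(\|\cdot\|_1\) the L1-norm on the compact interval, and \(\operatorname{dist}_1(f, P_n) = \inf_{p \in P_n} \| f - p \|_1\):

\[
\forall \varepsilon > 0 \;\; \forall p_1, p_2 \in P_n \left( \bigwedge_{i=1,2} \| f - p_i \|_1 \le \operatorname{dist}_1(f, P_n) + \Phi(\varepsilon) \;\longrightarrow\; \| p_1 - p_2 \|_1 \le \varepsilon \right).
\]

Uniformity means that Φ may depend on f only through data such as a modulus of uniform continuity and the degree n, not on f itself. Given such a Φ, any sequence of polynomials whose L1-errors approach \(\operatorname{dist}_1(f, P_n)\) with a known rate is Cauchy with an explicitly computable rate, which is what allows the best approximation itself to be computed.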
Similar resources
The computational content of Nonstandard Analysis
Kohlenbach’s proof mining program deals with the extraction of effective information from typically ineffective proofs. Proof mining has its roots in Kreisel’s pioneering work on the so-called unwinding of proofs. The proof mining of classical mathematics is rather restricted in scope due to the existence of sentences without computational content which are provable from the law of excluded mid...
Techniques in weak analysis for conservation results
We review and describe the main techniques for setting up systems of weak analysis, i.e. formal systems of second-order arithmetic related to subexponential classes of computational complexity. These involve techniques of proof theory (e.g., Herbrand’s theorem and the cut-elimination theorem) and model theoretic techniques like forcing. The techniques are illustrated for the particular case of ...
Introduction to clarithmetic II
The earlier paper "Introduction to clarithmetic I" constructed an axiomatic system of arithmetic based on computability logic, and proved its soundness and extensional completeness with respect to polynomial time computability. The present paper elaborates three additional sound and complete systems in the same style and sense: one for polynomial space computability, one for elementary recurs...
Time and Space Complexity Reduction of a Cryptanalysis Algorithm
Binary Decision Diagrams (BDDs) are an efficient data structure that has been used widely in computer science and engineering. BDD-based attacks in key stream cryptanalysis are among the best forms of attack in their category. In this paper, we propose a new key stream attack based on ZDDs (zero-suppressed BDDs). We show how a ZDD-based key stream attack is more efficient in time and ...
Journal title:
Volume, Issue:
Pages: -
Publication date: 2001